A note on entropy of logic

Authors

Abstract

Similar articles

A note on inequalities for Tsallis relative operator entropy

In this short note, we present some inequalities for the relative operator entropy which generalize results obtained by Zou [Operator inequalities associated with Tsallis relative operator entropy, Math. Inequal. Appl. 18 (2015), no. 2, 401--406]. We also show some new lower and upper bounds for the relative operator entropy and the Tsallis relative o...
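
For context, the objects involved are usually defined as follows (standard definitions given as background, not quoted from the paper): for positive invertible operators $A$, $B$ on a Hilbert space and $0 < \lambda \le 1$, the relative operator entropy is $S(A|B) = A^{1/2} \log\big(A^{-1/2} B A^{-1/2}\big) A^{1/2}$, and the Tsallis relative operator entropy is $T_\lambda(A|B) = \big(A^{1/2}(A^{-1/2} B A^{-1/2})^{\lambda} A^{1/2} - A\big)/\lambda$, which recovers $S(A|B)$ in the limit $\lambda \to 0$.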

A Note on the Bivariate Maximum Entropy Modeling

Let $X=(X_1, X_2)$ be a continuous random vector. Under the assumption that the marginal distributions of $X_1$ and $X_2$ are given, we develop models for the vector $X$ when there is partial information about the dependence structure between $X_1$ and $X_2$. The models obtained based on the well-known Principle of Maximum Entropy are called the maximum entropy (ME) mo...
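
For orientation, the maximum entropy principle referred to here selects, among all joint densities $f$ consistent with the prescribed marginals and the available dependence information (e.g. a fixed correlation), the one maximizing the differential entropy $H(f) = -\iint f(x_1,x_2)\,\log f(x_1,x_2)\,dx_1\,dx_2$. This formulation is a sketch of the kind of optimization the abstract alludes to, stated as general background rather than as the paper's exact model.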

The concept of logic entropy on D-posets

In this paper, a new invariant called logic entropy for dynamical systems on a D-poset is introduced. Also, the conditional logical entropy is defined and then some of its properties are studied. The invariance of the logic entropy of a system under isomorphism is proved. At the end, the notion of an $m$-generator of a dynamical system is introduced and a version of the Kolm...
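
As background (the definition below is the standard finite, classical case, not taken from the abstract): the logic (logical) entropy of a partition with block probabilities $p_1,\dots,p_n$ is $h = 1 - \sum_{i=1}^{n} p_i^2$, i.e. the probability that two independent draws fall in different blocks; the invariant in the paper transfers this kind of quantity to dynamical systems on D-posets.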

A note on Shannon entropy

We present a somewhat different way of looking at Shannon entropy. This leads to an axiomatisation of Shannon entropy that is essentially equivalent to that of Fadeev.
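
The quantity in question is the Shannon entropy $H(p_1,\dots,p_n) = -\sum_{i=1}^{n} p_i \log p_i$ of a finite probability vector; Fadeev-type axiomatisations characterise $H$, up to a positive constant factor, by symmetry, continuity of $H(p, 1-p)$ in $p$, and a recursive grouping rule such as $H(p_1,\dots,p_n) = H(p_1+p_2, p_3,\dots,p_n) + (p_1+p_2)\,H\big(\tfrac{p_1}{p_1+p_2}, \tfrac{p_2}{p_1+p_2}\big)$. (Stated here as standard background, not as the paper's exact axiom set.)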

A Note on Entropy Estimation

We compare an entropy estimator H(z) recently discussed by Zhang (2012) with two estimators, H(1) and H(2), introduced by Grassberger (2003) and Schürmann (2004). We prove the identity H(z) ≡ H(1), which has not been taken into account by Zhang (2012). Then we prove that the systematic error (bias) of H(1) is less than or equal to the bias of the ordinary likelihood (or plug-in) estimator of en...
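
For reference, the plug-in (maximum-likelihood) estimator against which these biases are compared is, for observed counts $n_1,\dots,n_K$ with $N = \sum_i n_i$, $\hat H_{\mathrm{plug\text{-}in}} = -\sum_{i=1}^{K} \frac{n_i}{N} \log \frac{n_i}{N}$; it is negatively biased (it underestimates the true entropy on average), which is the baseline that Grassberger/Schürmann-type corrections improve on. (Formula given as standard background, not taken from the abstract.)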

Journal

Journal title: YUJOR

Year: 2017

ISSN: 0354-0243, 1820-743X

DOI: 10.2298/yjor151025011b